
Conversation

@apappascs
Contributor

@apappascs apappascs commented Feb 7, 2025

This PR resolves #2283 by adding reasoning content support to OpenAiChatModel and related classes, following the approach of commit 45421b1.

Reasoning Content Details

reasoning_content: string, nullable
For the deepseek-reasoner model only. Represents the reasoning content of the assistant's message before the final answer.

Refer to the DeepSeek API documentation for more details: https://api-docs.deepseek.com/api/create-chat-completion

Changes in This PR
• Added a reasoningContent field to the metadata in OpenAiChatModel.
• Updated ChatCompletionMessage to include reasoningContent (see the sketch below).
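
For context, here is a minimal sketch (not the actual diff) of how the new field could sit on the OpenAiApi.ChatCompletionMessage record. The real record has many more components (name, tool calls, etc.), and the surrounding field names are approximations:

```java
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;

// Trimmed-down illustration only; the actual record in OpenAiApi carries more components.
@JsonInclude(JsonInclude.Include.NON_NULL)
public record ChatCompletionMessage(
        @JsonProperty("content") Object rawContent,
        @JsonProperty("role") String role,
        // New: populated only by reasoning models such as deepseek-reasoner, carrying the
        // chain of thought emitted before the final answer; null for regular completions.
        @JsonProperty("reasoning_content") String reasoningContent) {
}
```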

With this implementation, we can build projects like DeepClaude, enabling the integration of DeepSeek R1’s logical reasoning capabilities with Anthropic Claude’s (or any other model’s) creative and coding prowess. This will provide a powerful, combined language model experience.
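
To make that pattern concrete, here is a hypothetical orchestration sketch. DeepSeekClient, ClaudeClient, and ReasoningResult are illustrative stand-ins, not real Spring AI or vendor APIs; only the shape of the pipeline matters:

```java
// Hypothetical types for illustration purposes only.
interface ReasoningResult { String reasoning(); String answer(); }
interface DeepSeekClient { ReasoningResult reason(String question); }
interface ClaudeClient { String complete(String prompt); }

class DeepClaudeStylePipeline {

    private final DeepSeekClient reasoner;
    private final ClaudeClient writer;

    DeepClaudeStylePipeline(DeepSeekClient reasoner, ClaudeClient writer) {
        this.reasoner = reasoner;
        this.writer = writer;
    }

    String answer(String question) {
        // 1. Let the reasoning model think, keeping its reasoning trace separate from its answer.
        ReasoningResult result = reasoner.reason(question);
        // 2. Hand the trace to the second model as context for producing the final response.
        return writer.complete("Reasoning trace:\n" + result.reasoning()
                + "\n\nQuestion: " + question);
    }
}
```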

@apappascs apappascs force-pushed the feature/add-reasoning-content-support branch from 33c997c to e3947c2 on February 7, 2025 17:41
…For deepseek-reasoner (https://api-docs.deepseek.com/api/create-chat-completion)

- Added reasoningContent field to metadata in OpenAiChatModel
- Updated ChatCompletionMessage to include reasoningContent
- Modified OpenAiStreamFunctionCallingHelper to handle reasoningContent
- Updated tests to verify reasoningContent functionality

Signed-off-by: Alexandros Pappas <[email protected]>

@yangshaotuo yangshaotuo left a comment


When can it be officially used?

@markpollack
Member

I think the number of changes occurring to make the OpenAI model compatible with other models is starting to go too far; divergence starts to seem inevitable. We should look to make it easy to extend the OpenAI model support for different dedicated variations with minimal code, as compared to the current full-blown cut-and-paste approach. Moving this to the RC1 release so that we can better support DeepSeek as its own first-class model.

@markpollack markpollack modified the milestones: 1.0.0-M7, 1.0.0-RC1 Apr 4, 2025
@apappascs
Contributor Author

I think the number of changes occurring to make the OpenAI model compatible with other models is starting to go too far; divergence starts to seem inevitable. We should look to make it easy to extend the OpenAI model support for different dedicated variations with minimal code, as compared to the current full-blown cut-and-paste approach. Moving this to the RC1 release so that we can better support DeepSeek as its own first-class model.

Hi @markpollack

Thanks for the feedback on the PR—makes total sense.

Just to make sure I fully understand your intention before moving forward—are you suggesting that instead of adapting the existing spring-ai-openai module for DeepSeek compatibility, we should treat DeepSeek as its own first-class model (like spring-ai-mistral-ai, spring-ai-moonshot, etc.) and place it under models/spring-ai-deepseek or something similar?

Let me know if that’s the direction you had in mind, or if you’d prefer another approach. Once confirmed, I’m happy to take the lead on the implementation.

Thanks again!

@LinkCaau

LinkCaau commented Apr 11, 2025

When using OpenAI compatibility mode, some models may extend the protocol, and the rapid development of large models makes it difficult for frameworks like Spring AI to adapt quickly and flexibly. Is there a better way to achieve a permanent solution?
Examples:

  1. DeepSeek added a reasoning_content field.
  2. QianFan added a new finish_reason value, normal, but since the type is fixed as the ChatCompletionFinishReason enum, it causes serialization failures (illustrated below).
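
For the second example, here is a small hedged illustration in plain Jackson (not Spring AI code) of why a fixed enum type breaks on an unexpected finish_reason value, and one way a client can be made tolerant of unknown constants. The FinishReason and Choice types are made up for the demo, and records require jackson-databind 2.12+ on Java 16+:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustration only: "normal" is not a declared constant, so strict mapping fails.
enum FinishReason { STOP, LENGTH, TOOL_CALLS }

record Choice(FinishReason finish_reason) {}

class UnknownEnumDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"finish_reason\":\"normal\"}";

        // Default behavior: new ObjectMapper().readValue(json, Choice.class)
        // throws an InvalidFormatException because "normal" is not a known constant.

        // Tolerant behavior: map unknown enum values to null instead of failing.
        ObjectMapper lenient = new ObjectMapper()
                .enable(DeserializationFeature.READ_UNKNOWN_ENUM_VALUES_AS_NULL);
        Choice choice = lenient.readValue(json, Choice.class);
        System.out.println(choice.finish_reason()); // prints: null
    }
}
```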

@markpollack
Member

Hi @apappascs

Yes, you understood me correctly.

Please take a look at the comment here - #2116 (comment)

and let me know how you want to proceed. As I'm going through the issues and doing triage, I see more and more of these DeepSeek issues, which suggests it may take a bit of time to get the implementation stable before RC1 on May 13.

I suggest first going into the spring-ai-community and then moving it over into spring-ai once it matures.

@mxsl-gr

@apappascs
Contributor Author

Hi @markpollack,

Thanks for confirming.

I’ll get started on it. I believe we can have the implementation ready directly in spring-ai before the RC1 deadline on May 13. The spring-ai-community route does seem like a safer fallback if needed, but the goal is to have it solid enough to go straight into the main project.

@markpollack
Member

@apappascs It seems like a community project, where it can get active incubation and feedback, would be the most effective route. I suspect we will eventually need a ReasoningClient as well; it will be interesting to see how it turns out.

https://github.com/spring-ai-community/community/issues

@markpollack
Member

@mxsl-gr @apappascs There is now a DeepSeek model in Spring AI that handles this without interfering with OpenAiChatModel. Closing this, as any changes should be made in that part of the code base.

thanks!

@markpollack markpollack closed this May 9, 2025


Successfully merging this pull request may close these issues.

Output of reasoning_content that supports deepseek-r1 is required
